# Semantic Similarity Calculation

## Medical Embedded V2
shtilev · Apache-2.0 · 516 downloads · 1 like
This is a multilingual sentence embedding model capable of mapping sentences and paragraphs into a 512-dimensional dense vector space, suitable for tasks such as clustering and semantic search.
Tags: Text Embedding, Supports Multiple Languages
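Nearly every model on this page follows the same sentence-transformers usage pattern: encode texts into dense vectors, then compare the vectors with cosine similarity. Below is a minimal sketch of that pattern; the model ID shown is a well-known public checkpoint used only as a stand-in, and any embedding model from this list could be substituted.

```python
# Minimal semantic-similarity sketch with the sentence-transformers library.
# The model ID is a stand-in; substitute any embedding model from this page.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")

sentences = [
    "The patient was treated for a fractured wrist.",
    "A broken wrist was treated at the clinic.",
    "The stock market closed higher today.",
]

# encode() returns one dense vector per sentence; normalizing makes the dot
# product equal to cosine similarity.
embeddings = model.encode(sentences, normalize_embeddings=True)

# Pairwise cosine similarities; related sentences score close to 1.0.
print(util.cos_sim(embeddings, embeddings))
```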
## Gte Multilingual Reranker Base Onnx Op14 Opt Gpu Int8
JustJaro · MIT · 91 downloads · 1 like
An INT8-quantized ONNX export of Alibaba-NLP/gte-multilingual-reranker-base, optimized for GPU inference and suitable for text classification tasks.
Tags: Text Embedding, Other
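Rerankers like this one are cross-encoders: each (query, document) pair is scored jointly, and the scores are used to reorder the candidates returned by a first-stage retriever. The sketch below assumes the original Alibaba-NLP/gte-multilingual-reranker-base checkpoint loaded with standard transformers classes; the quantized ONNX export listed here would instead be run through an ONNX Runtime session.

```python
# Cross-encoder reranking sketch. Assumes the original (non-quantized)
# Alibaba-NLP/gte-multilingual-reranker-base checkpoint; the INT8 ONNX export
# listed above would be loaded through ONNX Runtime instead.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "Alibaba-NLP/gte-multilingual-reranker-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, trust_remote_code=True)
model.eval()

query = "how do I renew a passport"
candidates = [
    "Passport renewal applications can be submitted by mail using form DS-82.",
    "A quick recipe for banana bread with walnuts.",
]

# Score each (query, candidate) pair; higher logits mean higher relevance.
pairs = [[query, doc] for doc in candidates]
inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    scores = model(**inputs).logits.view(-1)

for doc, score in sorted(zip(candidates, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {doc}")
```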
## Bge M3 Legal Retrieval
Quintu · Apache-2.0 · 172 downloads · 1 like
A fine-tuned version of the bge-m3 model optimized for legal document retrieval tasks, specifically designed to handle the nuances of legal language, enabling accurate and efficient retrieval of relevant legal documents based on semantic similarity.
Tags: Text Embedding, English
## Entity Matching Jobs
engineai · 47 downloads · 2 likes
A model trained with the sentence-transformers framework, designed specifically for semantic matching and similarity calculation of job titles.
Tags: Text Embedding
## Modernbert Base ColBERT
Y-J-Ju · 88 downloads · 7 likes
This is a PyLate model fine-tuned from answerdotai/ModernBERT-base on the MS-MARCO dataset, designed for sentence similarity calculation and document retrieval.
Tags: Text Embedding, English
## Multilingual E5 Large Pooled
Hiveurban · MIT · 3,803 downloads · 2 likes
Multilingual E5 Large is a sentence-transformer model focused on sentence similarity and feature extraction tasks, with support for a wide range of languages.
Tags: Text Embedding, Supports Multiple Languages
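A practical detail for the E5 family is that inputs are expected to carry a task prefix: "query: " for queries and "passage: " for documents. The sketch below assumes the standard intfloat/multilingual-e5-large checkpoint rather than this pooled variant.

```python
# E5-style usage with task prefixes ("query: " / "passage: ").
# Assumes the standard intfloat/multilingual-e5-large checkpoint, not the
# pooled variant listed above.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("intfloat/multilingual-e5-large")

queries = ["query: how much protein should a female eat"]
passages = [
    "passage: The CDC's average protein requirement for women aged 19 to 70 "
    "is 46 grams per day.",
    "passage: Definition of summit for English language learners.",
]

q_emb = model.encode(queries, normalize_embeddings=True)
p_emb = model.encode(passages, normalize_embeddings=True)

# Cosine similarity between the query and each passage.
print(util.cos_sim(q_emb, p_emb))
```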
## Vietnamese Document Embedding
dangvantuan · Apache-2.0 · 77.61k downloads · 15 likes
A document embedding model for Vietnamese, built on gte-multilingual and supporting contexts of up to 8096 tokens.
Tags: Text Embedding, Transformers, Other
## Nase
aiana94 · Apache-2.0 · 14 downloads · 3 likes
NaSE is a news domain-specialized multilingual sentence encoder, based on LaBSE with domain-specific training, supporting sentence embedding and similarity calculation for 100+ languages.
Tags: Text Embedding, Transformers, Supports Multiple Languages

## Roberta Large InBedder
BrandonZYW · MIT · 17 downloads · 2 likes
InBedder is a text embedder specifically designed to follow instructions, capable of capturing text features specified by user instructions through question-answering.
Tags: Text Embedding, Transformers, English

## GIST Small Embedding V0
avsolatorio · MIT · 945.68k downloads · 29 likes
A text embedding model fine-tuned based on BAAI/bge-small-en-v1.5, trained with the MEDI dataset and MTEB classification task datasets, optimized for query encoding in retrieval tasks.
Tags: Text Embedding, English
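Retrieval-oriented embedding models such as this one are typically used by embedding a corpus once, then embedding each incoming query and taking the nearest corpus vectors. A minimal semantic-search sketch follows; the repository ID avsolatorio/GIST-small-Embedding-v0 is the assumed Hugging Face ID for this entry.

```python
# Minimal semantic-search sketch: embed a corpus once, then retrieve the
# closest documents for each query. The repository ID is assumed from the
# entry above.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("avsolatorio/GIST-small-Embedding-v0")

corpus = [
    "The Eiffel Tower is located in Paris.",
    "Photosynthesis converts light into chemical energy.",
    "The 2020 Olympic Games were held in Tokyo in 2021.",
]
corpus_emb = model.encode(corpus, convert_to_tensor=True, normalize_embeddings=True)

query_emb = model.encode(
    "Where is the Eiffel Tower?", convert_to_tensor=True, normalize_embeddings=True
)

# semantic_search returns the top-k corpus indices and cosine scores per query.
hits = util.semantic_search(query_emb, corpus_emb, top_k=2)[0]
for hit in hits:
    print(f'{hit["score"]:.3f}  {corpus[hit["corpus_id"]]}')
```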
## Bge Base En V1.5 Onnx Q
Qdrant · Apache-2.0 · 47.21k downloads · 1 like
A quantized ONNX export of BAAI/bge-base-en-v1.5, a Transformer-based English text embedding model primarily used for sentence similarity calculation and text classification tasks.
Tags: Text Embedding, Transformers
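ONNX exports like this one can be run without PyTorch model code via ONNX Runtime, for example through Hugging Face Optimum. A rough sketch under stated assumptions follows: the repository ID is inferred from the entry above and may differ, and BGE models take the [CLS] token embedding, L2-normalized, as the sentence vector.

```python
# Sketch of running an ONNX-exported BGE embedding model with Hugging Face
# Optimum + ONNX Runtime. The repository ID is assumed from the entry above
# and the exact file layout of the quantized export may differ.
import torch
from optimum.onnxruntime import ORTModelForFeatureExtraction
from transformers import AutoTokenizer

repo = "Qdrant/bge-base-en-v1.5-onnx-q"  # assumed repository ID
tokenizer = AutoTokenizer.from_pretrained(repo)
model = ORTModelForFeatureExtraction.from_pretrained(repo)

texts = ["ONNX Runtime makes CPU inference fast.", "Quantization shrinks model size."]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
outputs = model(**inputs)

# BGE models use the [CLS] token embedding, L2-normalized, as the sentence vector.
emb = torch.nn.functional.normalize(outputs.last_hidden_state[:, 0], p=2, dim=1)
print(emb @ emb.T)  # cosine similarity matrix
```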
## Stsb Bert Tiny Safetensors
sentence-transformers-testing · 136.99k downloads · 4 likes
This is a lightweight sentence embedding model based on the BERT architecture, capable of converting sentences and paragraphs into 128-dimensional dense vectors, suitable for tasks such as semantic similarity calculation.
Tags: Text Embedding, Transformers

## Sentence Transformers Multilingual E5 Large
smart-tribune · MIT · 276 downloads · 0 likes
This is a multilingual sentence embedding model based on sentence-transformers, capable of mapping text to a 1024-dimensional vector space, suitable for semantic search and clustering tasks.
Tags: Text Embedding

## Constructionembeddingbert
ahhany · 25 downloads · 0 likes
This is a sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 1536-dimensional dense vector space.
Tags: Text Embedding

## Gte Tiny
TaylorAI · 74.46k downloads · 138 likes
GTE Tiny is a small general-purpose text embedding model suitable for various natural language processing tasks.
Tags: Text Embedding, Transformers
## Bge Large En V1.5 Quant
RedHatAI · MIT · 1,094 downloads · 22 likes
A quantized (INT8) ONNX variant of BGE-large-en-v1.5, with inference acceleration via DeepSparse.
Tags: Text Embedding, Transformers, English
## Sup Simcse Ja Base
cl-nagoya · 3,027 downloads · 2 likes
A Japanese sentence embedding model fine-tuned with the supervised SimCSE method, suitable for sentence similarity calculation and feature extraction tasks.
Tags: Text Embedding, Transformers, Japanese
## S DagoBERT STSb
jpostma · 13 downloads · 0 likes
This is a sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence similarity calculation, semantic search, and clustering.
Tags: Text Embedding, Transformers

## UNSEE CorInfoMax
asparius · 16 downloads · 0 likes
This is a sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding, Transformers

## Gte Small
Supabase · MIT · 481.27k downloads · 89 likes
GTE-small is a general text embedding model trained by Alibaba DAMO Academy, based on the BERT framework, suitable for tasks such as information retrieval and semantic text similarity.
Tags: Text Embedding, Transformers, English
## E5 Base Multilingual 4096
efederici · 340 downloads · 16 likes
E5-base-multilingual-4096 is a local-sparse-global (LSG) variant of intfloat/multilingual-e5-base, a multilingual text embedding model, extended to process inputs of up to 4096 tokens.
Tags: Text Embedding, Transformers, Supports Multiple Languages
## Mmarco Mnrl Bert Base Italian Uncased
nickprock · 153 downloads · 1 like
This is a model based on sentence-transformers that can map sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding, Transformers

## Transformer
kpourdeilami · 44 downloads · 0 likes
This is a model based on sentence-transformers that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence similarity calculation, clustering, and semantic search.
Tags: Text Embedding, Transformers

## Keyphrase Mpnet V1
uclanlp · 4,278 downloads · 2 likes
A sentence transformer model optimized for phrases, mapping phrases into a 768-dimensional dense vector space, suitable for tasks like clustering or semantic search.
Tags: Text Embedding, Transformers

## Sentence Transformers Alephbert
imvladikon · 4,768 downloads · 7 likes
This is a Hebrew sentence embedding model based on AlephBERT, capable of mapping sentences and paragraphs into a 768-dimensional vector space, suitable for tasks such as semantic search and clustering.
Tags: Text Embedding, Transformers, Other

## Contriever Gpl Hotpotqa
income · 13 downloads · 0 likes
This is a model based on sentence-transformers that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding, Transformers

## Jurimodel
ramdane · 121 downloads · 0 likes
This is a model based on sentence-transformers that maps sentences and paragraphs into a 768-dimensional dense vector space for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding, Transformers

## Paraphrase Xlm R Multilingual V1 Fine Tuned For Medieval Latin
silencesys · 66 downloads · 3 likes
This is a model based on sentence-transformers that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence similarity calculation, clustering, and semantic search.
Tags: Text Embedding, Transformers

## Raw 2 No 1 Test 2 New.model
Wheatley961 · 13 downloads · 0 likes
This is a model based on sentence-transformers that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding, Transformers

## Paraphrastic Test
jwieting · 35 downloads · 0 likes
This is a sentence embedding model based on sentence-transformers, capable of converting text into 1024-dimensional vector representations, suitable for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding, Transformers

## Tat Model
mathislucka · 22 downloads · 0 likes
This is a sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional vector space, suitable for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding

## CORD 19 Title Abstracts 1 More Epoch
CShorten · 13 downloads · 0 likes
This is a model based on sentence-transformers that can map sentences and paragraphs into a 384-dimensional dense vector space, suitable for tasks such as clustering or semantic search.
Tags: Text Embedding

## Bmz Topics
peter2000 · 13 downloads · 1 like
This is a sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence similarity calculation and semantic search.
Tags: Text Embedding, Transformers
## Laprador Pt Pb
gemasphi · 13 downloads · 0 likes
A sentence embedding model based on sentence-transformers that maps text to a 768-dimensional vector space.
Tags: Text Embedding, Transformers
## Stpushtohub Test
NimaBoscarino · 33 downloads · 0 likes
This is a sentence embedding model based on sentence-transformers, capable of mapping text to a 768-dimensional vector space.
Tags: Text Embedding, Transformers

## Laprador Trained
gemasphi · 31 downloads · 0 likes
This is a sentence embedding model based on sentence-transformers, capable of mapping sentences and paragraphs into a 768-dimensional vector space, suitable for tasks such as semantic search and clustering.
Tags: Text Embedding, Transformers
## Testmodel
WalidLak · 15 downloads · 0 likes
This is a sentence similarity model based on sentence-transformers, capable of mapping text to a 384-dimensional vector space.
Tags: Text Embedding
## Mcontriever Base Msmarco
nthakur · 195 downloads · 5 likes
This is a sentence embedding model based on sentence-transformers, capable of mapping text to a 768-dimensional vector space, suitable for semantic search and clustering tasks.
Tags: Text Embedding, Transformers

## Contriever Base Msmarco
nthakur · 2,243 downloads · 2 likes
This is a version of the Contriever MSMARCO model adapted for the Sentence Transformer framework, capable of mapping text to a 768-dimensional dense vector space, suitable for semantic search and clustering tasks.
Tags: Text Embedding, Transformers
## HPD TinyBERT F128
Xuandong · Apache-2.0 · 24 downloads · 1 like
A sentence embedding model compressed via Homomorphic Projective Distillation (HPD), containing only 14 million parameters with a model size of 55 MB, suitable for semantic retrieval tasks.
Tags: Text Embedding, Transformers